Generative Local Metric Learning for Nadaraya-Watson Kernel Estimation
Authors
Abstract
• For Gaussian data, the regression needs to compute the kernel with only two-dimensional information, regardless of the original dimensionality, and it achieves a mean square error (MSE) close to zero with finite samples.
• A well-known extension of the NW estimator, locally linear regression (LLR), is known to alleviate the asymptotic bias. The proposed metric reduces the bias differently from LLR, and the proposed method substantially outperforms LLR in the empirical experiments.
• To reduce MSE, reducing the variance via the metric is not important: the asymptotic contribution of the variance shows little dependency on the metric once an appropriate bandwidth is selected in a high-dimensional space.
• A simple and coarse generative model captures an appropriate metric for nonparametric NW estimation.
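The points above concern the NW estimator evaluated under a learned metric. As a minimal sketch of the idea (not the paper's actual metric-learning procedure), the standard Gaussian-kernel NW estimator can be generalized by replacing Euclidean distance with a Mahalanobis distance under a positive-definite matrix `A`; the function name and the single global metric are illustrative assumptions:

```python
import numpy as np

def nw_estimate(X, y, x0, A, h=1.0):
    """Nadaraya-Watson estimate at query point x0 under a metric A.

    X: (n, d) training inputs; y: (n,) training targets.
    A: (d, d) positive-definite metric matrix; A = I recovers the
    standard Euclidean NW estimator. Illustrative sketch only -- the
    paper learns the metric from a generative model, which is not
    reproduced here.
    """
    diff = X - x0                                   # (n, d) differences
    d2 = np.einsum('ni,ij,nj->n', diff, A, diff)    # squared Mahalanobis distances
    w = np.exp(-0.5 * d2 / h**2)                    # Gaussian kernel weights
    return float(w @ y / w.sum())                   # weighted average of targets
```

With `A = I` this reduces to the ordinary NW estimator; a metric adapted to the data reshapes the kernel neighborhoods, which is the mechanism by which the bias is reduced.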
Similar resources
Evolutionary kernel density regression
The Nadaraya–Watson estimator, also known as kernel regression, is a density-based regression technique. It weights output values with the relative densities in input space. The density is measured with kernel functions that depend on bandwidth parameters. In this work we present an evolutionary bandwidth optimizer for kernel regression. The approach is based on a robust loss function, leave-on...
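The bandwidth optimization described above can be illustrated with a brute-force leave-one-out criterion; an evolutionary optimizer would search over bandwidths by minimizing a (possibly robust) version of this loss. The function names and the grid search are illustrative assumptions, not the paper's method:

```python
import numpy as np

def loo_mse(X, y, h):
    """Leave-one-out MSE of the Gaussian-kernel NW estimator at bandwidth h."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-0.5 * d2 / h**2)
    np.fill_diagonal(W, 0.0)          # exclude each point from its own estimate
    yhat = W @ y / W.sum(axis=1)
    return float(np.mean((y - yhat) ** 2))

def select_bandwidth(X, y, grid):
    """Pick the bandwidth with the lowest leave-one-out MSE from a grid."""
    return min(grid, key=lambda h: loo_mse(X, y, h))
```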
On the Adaptive Nadaraya-Watson Kernel Regression Estimators
Nonparametric kernel estimators are widely used in many research areas of statistics. An important nonparametric kernel estimator of a regression function is the Nadaraya-Watson kernel regression estimator, which is often obtained by using a fixed bandwidth. However, adaptive kernel estimators with varying bandwidths are especially used to estimate the density of long-tailed and multi-mod dis...
Sequential Fixed-width Confidence Bands for Kernel Regression Estimation
We consider a random design model based on independent and identically distributed (iid) pairs of observations (Xi, Yi), where the regression function m(x) is given by m(x) = E(Yi|Xi = x) with one independent variable. In a nonparametric setting the aim is to produce a reasonable approximation to the unknown function m(x) when we have no precise information about the form of the true density, f...
Large and moderate deviations principles for kernel estimators of the multivariate regression
Abstract: In this paper, we prove a large deviations principle for the Nadaraya-Watson estimator and for the semi-recursive kernel estimator of the regression in the multidimensional case. Under suitable conditions, we show that the rate function is a good rate function. We thus generalize the results already obtained in the unidimensional case for the Nadaraya-Watson estimator. Moreover, we giv...
Weighted Nadaraya-Watson Regression Estimation
In this article we study nonparametric estimation of the regression function using the weighted Nadaraya-Watson approach. We establish the asymptotic normality and weak consistency of the resulting estimator for α-mixing time series at both boundary and interior points, and we show that the estimator preserves the bias, variance, and, more importantly, the automatic good boundary behavior properties of...
Publication year: 2015